496 results for Disease Outbreaks

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

Since the severe acute respiratory syndrome outbreak in 2003, it has been argued that there has been a substantial revision to the norm dictating the behaviour of states in the event of a disease outbreak. This article examines the evolution of the norm to ‘report and verify’ disease outbreaks and evaluates the extent to which this revised norm has begun to guide state behaviour. Examination of select East Asian countries affected by human infections of the H5N1 (avian influenza) virus strain reveals the need to further understand the mutually constitutive relationship between the value attached to prompt reporting and the capacity to report, and how states manage both in fulfilling their duty to report.

Relevance:

100.00%

Publisher:

Abstract:

The capacity to conduct international disease outbreak surveillance and to share information about outbreaks quickly has empowered both state and non-state actors to take an active role in stopping the spread of disease, generating new technical means of identifying potential pandemics through shared reporting platforms. Despite all the rhetoric about the importance of infectious disease surveillance, the concept itself has received relatively little critical attention from academics, practitioners, and policymakers. This book asks leading contributors in the field to engage with five key issues attached to international disease outbreak surveillance - transparency, local engagement, practical needs, integration, and appeal - to illuminate the political effect of these technologies on those who use surveillance, those who respond to surveillance, and those being monitored.

Relevance:

80.00%

Publisher:

Abstract:

Objective: To evaluate the performance of China’s infectious disease automated alert and response system in the detection of outbreaks of hand, foot and mouth (HFM) disease. Methods: We estimated the size, duration and reporting delay of HFM disease outbreaks from cases notified between 1 May 2008 and 30 April 2010 and between 1 May 2010 and 30 April 2012, i.e. before and after HFM disease was included in the automated alert and response system. Sensitivity, specificity and timeliness of detection of aberrations in the incidence of HFM disease were estimated by comparing automated detections with the observations of public health staff. Findings: The alert and response system recorded 106 005 aberrations in the incidence of HFM disease between 1 May 2010 and 30 April 2012 – a mean of 5.6 aberrations per 100 days in each county that reported HFM disease. The system had a sensitivity of 92.7% and a specificity of 95.0%. The mean delay between the reporting of the first case of an outbreak and the detection of that outbreak by the system was 2.1 days. Between the first and second study periods, the mean size of an HFM disease outbreak decreased from 19.4 to 15.8 cases, and the mean interval between the onset of such an outbreak and its initial reporting to the public health emergency reporting system decreased from 10.0 to 9.1 days. Conclusion: The automated alert and response system shows good sensitivity in the detection of HFM disease outbreaks and appears to be relatively rapid. Continued use of this system should allow more effective prevention and limitation of such outbreaks in China.
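To make the evaluation metrics concrete, here is a minimal sketch, on invented toy records, of how sensitivity and mean detection delay can be computed by matching automated signals to staff-confirmed outbreaks; the paper's actual matching protocol, and the specificity calculation (which needs the signal-free, outbreak-free county-days), are not reproduced.

    from datetime import date, timedelta

    # Hypothetical toy records, for illustration only: automated alert dates per
    # county, and staff-confirmed outbreaks with the date their first case was reported.
    signals = {"CountyA": [date(2011, 3, 3)], "CountyB": []}
    outbreaks = [
        {"county": "CountyA", "first_case": date(2011, 3, 1)},
        {"county": "CountyB", "first_case": date(2011, 5, 4)},
    ]

    def first_detection(outbreak, window_days=14):
        # Earliest signal in the outbreak's county within `window_days` of its first case.
        hits = [s for s in signals[outbreak["county"]]
                if timedelta(0) <= s - outbreak["first_case"] <= timedelta(days=window_days)]
        return min(hits) if hits else None

    detections = [first_detection(ob) for ob in outbreaks]
    sensitivity = sum(d is not None for d in detections) / len(outbreaks)
    delays = [(d - ob["first_case"]).days for ob, d in zip(outbreaks, detections) if d]
    mean_delay = sum(delays) / len(delays)
    print(f"sensitivity={sensitivity:.0%}, mean delay={mean_delay:.1f} days")
    # Specificity additionally needs the true negatives: county-days with neither
    # a signal nor an outbreak, which this toy example omits.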

Relevance:

70.00%

Publisher:

Abstract:

Barmah Forest virus (BFV) disease is one of the most widespread mosquito-borne diseases in Australia. The number of outbreaks and the incidence rate of BFV disease in Australia have attracted growing concern about the spatio-temporal complexity and underlying risk factors of the disease. A large number of notifications have been recorded continuously in Queensland since 1992, yet little is known about the spatial and temporal characteristics of the disease. I aim to use notification data to better understand the effects of climatic, demographic, socio-economic and ecological risk factors on the spatial epidemiology of BFV disease transmission, to develop predictive risk models and to forecast future disease risks under climate change scenarios. Computerised data files of daily notifications of BFV disease and climatic variables in Queensland during 1992-2008 were obtained from Queensland Health and the Australian Bureau of Meteorology, respectively. Projections of climate data for the years 2025, 2050 and 2100 were obtained from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). Data on socio-economic, demographic and ecological factors were also obtained from the relevant government departments as follows: 1) socio-economic and demographic data from the Australian Bureau of Statistics; 2) wetlands data from the Department of Environment and Resource Management; and 3) tidal readings from the Queensland Department of Transport and Main Roads. Disease notifications were geocoded, and spatial and temporal patterns of disease were investigated using geostatistics. Visualisation of BFV disease incidence rates through mapping reveals substantial spatio-temporal variation at the statistical local area (SLA) level over time. Results reveal high incidence rates of BFV disease along coastal areas relative to Queensland as a whole. A Mantel-Haenszel chi-square analysis for trend reveals a statistically significant relationship between BFV disease incidence rates and age group (χ² = 7587, p < 0.01). Semi-variogram analysis and smoothed maps created from interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across the state. A cluster analysis was used to detect hot spots/clusters of BFV disease at the SLA level. The most likely spatial and space-time clusters were detected at the same locations across coastal Queensland (p < 0.05). The study demonstrates heterogeneity of disease risk at the SLA level and reveals the spatial and temporal clustering of BFV disease in Queensland. Because the importance of wetlands in the transmission of BFV disease remains unclear, discriminant analysis was employed to establish a link between wetland classes, climate zones and BFV disease. The multivariable discriminant modelling analyses demonstrate that the wetland types saline 1, riverine and saline tidal influence were the most significant risk factors for BFV disease in all climate and buffer zones, while lacustrine, palustrine, estuarine, saline 2 and saline 3 wetlands were less important. The model accuracies were 76%, 98% and 100% for BFV risk in the subtropical, tropical and temperate climate zones, respectively. This study demonstrates that BFV disease risk varied with wetland class and climate zone, and suggests that wetlands may act as potential breeding habitats for BFV vectors. Multivariable spatial regression models were then applied to assess the impact of spatial climatic, socio-economic and tidal factors on BFV disease in Queensland.
Spatial regression models were developed to account for spatial effects, and they generated superior estimates relative to a traditional regression model. In the spatial regression models, BFV disease incidence in coastal areas shows an inverse relationship with minimum temperature, low tide and distance to coast, and a positive relationship with rainfall; across Queensland as a whole, the disease shows an inverse relationship with minimum temperature and high tide and a positive relationship with rainfall. This study identifies the most significant spatial risk factors for BFV disease across Queensland. Empirical models were developed to forecast the future risk of BFV disease outbreaks in coastal Queensland using existing climatic, socio-economic and tidal conditions under climate change scenarios. Logistic regression models were developed using BFV disease outbreak data for the existing period (2000-2008). The most parsimonious model had high sensitivity, specificity and accuracy, and was used to estimate and forecast BFV disease outbreaks for the years 2025, 2050 and 2100 under climate change scenarios for Australia. Important contributions arising from this research are that: (i) it identifies high-risk coastal areas innovatively, by creating buffers based on grid centroids and using fine-grained spatial units, i.e. mesh blocks; (ii) it uses a spatial regression method to account for spatial dependence and heterogeneity of the data in the study area; (iii) it determines a range of potential spatial risk factors for BFV disease; and (iv) it predicts the future risk of BFV disease outbreaks under climate change scenarios in Queensland, Australia. In conclusion, the thesis demonstrates that the distribution of BFV disease exhibits distinct spatial and temporal variation, influenced by a range of spatial risk factors including climatic, demographic, socio-economic, ecological and tidal variables. The thesis demonstrates that spatial regression methods can be applied to better understand the transmission dynamics of BFV disease and its risk factors. The research findings show that disease notification data can be integrated with multi-factorial risk factor data to build predictive models and forecast potential future disease risks under climate change scenarios. This thesis may have implications for BFV disease control and prevention programs in Queensland.
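To make the final modelling step concrete, here is a minimal sketch, on synthetic data with hypothetical predictor names, of a logistic regression outbreak model of the kind described above; the thesis's actual variable selection, buffer construction and climate-scenario inputs are not reproduced.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200  # hypothetical area-month records

    # Hypothetical predictors: rainfall (mm), minimum temperature (C), high tide (m).
    X = np.column_stack([
        rng.gamma(2.0, 40.0, n),   # rainfall
        rng.normal(18.0, 4.0, n),  # minimum temperature
        rng.normal(1.5, 0.3, n),   # high tide
    ])
    # Synthetic outbreak indicator loosely mimicking the reported signs:
    # positive with rainfall, inverse with minimum temperature and tide.
    eta = -2.0 + 0.01 * X[:, 0] - 0.10 * X[:, 1] - 1.0 * X[:, 2]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    print(model.params)  # fitted intercept and coefficients

    # Forecasting under a climate scenario then amounts to predicting with
    # scenario values of the predictors (all numbers here are invented).
    scenario = sm.add_constant(np.array([[120.0, 20.0, 1.7]]), has_constant="add")
    print(model.predict(scenario))  # outbreak probability under the scenario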

Relevance:

70.00%

Publisher:

Abstract:

Since the revisions to the International Health Regulations (IHR) in 2005, much attention has turned to how states, particularly developing states, will address core capacity requirements attached to the revised IHR. Primarily, how will states strengthen their capacity to identify and verify public health emergencies of international concern (PHEIC)? Another important but under-examined aspect of the revised IHR is the empowerment of the World Health Organization (WHO) to act upon non-governmental reports of disease outbreaks. The revised IHR potentially marks a new chapter in the powers of ‘disease intelligence’ and how the WHO may press states to verify an outbreak event. This article seeks to understand whether internet surveillance response programs (ISRPs) are effective in ‘naming and shaming’ states into reporting disease outbreaks.

Relevance:

70.00%

Publisher:

Abstract:

Over the past decade there has been an increased awareness in the field of international relations of the potential impact of an infectious disease epidemic on national security. While states’ attempts to combat infectious disease have a long history, what is new in this area is the adoption at the international level of securitized responses to the containment of infectious disease. This article argues that the securitization of infectious disease by states and the World Health Organization (WHO) has led to two key developments. First, the WHO has had to assert itself as the primary actor that all states, particularly western states, can rely upon to contain the threat of infectious diseases. The WHO's apparent success in this is evidenced by the development of the Global Outbreak Alert and Response Network (GOARN), which has led to arguments that the WHO has emerged as the key authority in global health governance. The second development that this article seeks to explore is the WHO's authority in the area of infectious disease surveillance. In particular, does GOARN represent the WHO's consummate authority in coordinating infectious disease response, or is it the product of the WHO's capitulation to western states’ concerns with preventing infectious disease outbreaks from reaching their borders? If the latter, arguments asserting the authority of the WHO in infectious disease response may be premature.

Relevance:

70.00%

Publisher:

Abstract:

Since the outbreak of Severe Acute Respiratory Syndrome (SARS) in 2003, there has been much discussion about whether the international community has moved into a new post-Westphalian era, where states increasingly recognize certain shared norms that guide what they ought to do in responding to infectious disease outbreaks. In this article I identify this new obligation as the ‘duty to report’, and evaluate competing accounts of the degree to which states appreciate this obligation by examining state behaviour during the human H5N1 infection outbreaks in East Asia (since 2004). The article examines reporting behaviour for human H5N1 cases in Cambodia, China, Indonesia, Thailand and Vietnam from 2004 to 2010. The findings lend strong support to the claim that East Asian states have come to accept and comply with the duty to report infectious disease outbreaks, and that assertions of sovereignty in response to global health governance frameworks have not systematically inhibited reporting compliance.

Relevance:

70.00%

Publisher:

Abstract:

Background: Detection of outbreaks is an important part of disease surveillance. Although many algorithms have been designed for detecting outbreaks, few have been specifically assessed against diseases that have distinct seasonal incidence patterns, such as those caused by vector-borne pathogens. Methods: We applied five previously reported outbreak detection algorithms to Ross River virus (RRV) disease data (1991-2007) for the four local government areas (LGAs) of Brisbane, Emerald, Redland and Townsville in Queensland, Australia. The methods used were the Early Aberration Reporting System (EARS) C1, C2 and C3 methods, the negative binomial CUSUM (NBC), the historical limits method (HLM), the Poisson outbreak detection (POD) method and the purely temporal SaTScan analysis. Seasonally adjusted variants of the NBC and SaTScan methods were developed. Some of the algorithms were applied using a range of parameter values, resulting in 17 variants of the five algorithms. Results: The 9,188 RRV disease notifications that occurred in the four selected regions over the study period showed marked seasonality, which adversely affected the performance of some of the outbreak detection algorithms. Most of the methods examined were able to detect the same major events; the exception was the seasonally adjusted NBC methods, which detected an excess of short signals. The NBC, POD and temporal SaTScan algorithms were the only methods that consistently had high true-positive rates and low false-positive and false-negative rates across the four study areas. The timeliness of the outbreak signals generated by each method was also compared, but there was no consistency across outbreaks and LGAs. Conclusions: This study has highlighted several issues associated with applying outbreak detection algorithms to seasonal disease data. In the absence of a true gold standard, quantitative comparison is difficult, and caution should be taken when interpreting true positives, false positives, sensitivity and specificity.
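For readers unfamiliar with the simplest of these detectors, a minimal sketch of the EARS C1 statistic on a daily count series follows; C2 and C3 differ mainly in adding a guard band and carrying over recent exceedances, and all parameter choices here are illustrative.

    import numpy as np

    def ears_c1(counts, baseline=7, threshold=3.0):
        # Flag day t when its count exceeds the mean of the previous
        # `baseline` days by more than `threshold` sample standard deviations.
        counts = np.asarray(counts, dtype=float)
        alarms = []
        for t in range(baseline, len(counts)):
            window = counts[t - baseline:t]
            mu, sd = window.mean(), window.std(ddof=1)
            sd = max(sd, 0.5)  # illustrative floor to avoid dividing by ~0 in quiet weeks
            if (counts[t] - mu) / sd > threshold:
                alarms.append(t)
        return alarms

    # Toy seasonal series with an injected outbreak on days 40-42, mimicking the
    # seasonality problem discussed above: the baseline itself rises and falls.
    rng = np.random.default_rng(1)
    series = rng.poisson(3 + 2 * np.sin(np.arange(60) / 9.5))
    series[40:43] += 12
    print(ears_c1(series))  # indices of days that raise an alarm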

Relevance:

70.00%

Publisher:

Abstract:

In the age of air travel and globalized trade, pathogens that once took months or even years to spread beyond their regions of origin can now circumnavigate the globe in a matter of hours. Amid growing concerns about such epidemics as Ebola, SARS, MERS, and H1N1, disease diplomacy has emerged as a key foreign and security policy concern as countries work to collectively strengthen the global systems of disease surveillance and control. The revision of the International Health Regulations (IHR), eventually adopted by the World Health Organization’s member states in 2005, was the foremost manifestation of this novel diplomacy. The new regulations heralded a profound shift in international norms surrounding global health security, significantly expanding what is expected of states in the face of public health emergencies and requiring them to improve their capacity to detect and contain outbreaks. Drawing on Martha Finnemore and Kathryn Sikkink’s "norm life cycle" framework and based on extensive documentary analysis and key informant interviews, Disease Diplomacy traces the emergence of these new norms of global health security, the extent to which they have been internalized by states, and the political and technical constraints governments confront in attempting to comply with their new international obligations. The authors also examine in detail the background, drafting, adoption, and implementation of the IHR while arguing that the very existence of these regulations reveals an important new understanding: that infectious disease outbreaks and their management are critical to national and international security. The book will be of great interest to academic researchers, postgraduate students, and advanced undergraduates in the fields of global public health, international relations, and public policy, as well as health professionals, diplomats, and practitioners with a professional interest in global health security.

Relevance:

70.00%

Publisher:

Abstract:

The capacity to conduct international disease outbreak surveillance and to share information about outbreaks quickly has empowered both state and non-state actors to take an active role in stopping the spread of disease, generating new technical means of identifying potential pandemics through shared reporting platforms. Despite all the rhetoric about the importance of infectious disease surveillance, the concept itself has received relatively little critical attention from academics, practitioners, and policymakers. This book asks leading contributors in the field to engage with five key issues attached to international disease outbreak surveillance - transparency, local engagement, practical needs, integration, and appeal - to illuminate the political effect of these technologies on those who use surveillance, those who respond to surveillance, and those being monitored.

Relevance:

60.00%

Publisher:

Abstract:

The mud crab (Scylla spp.) aquaculture industry has expanded rapidly in recent years in many countries in the Indo-West Pacific (IWP) region as an alternative to marine shrimp culture, because of significant disease outbreaks and the associated failure of many shrimp culture industries in the region. Currently, the practices used to produce and manage breeding crabs in hatcheries may erode levels of genetic diversity, ultimately compromising growth rates, disease resistance and stock productivity. Therefore, to avoid “genetic pollution” and its harmful effects, and to promote further development of mud crab aquaculture and fisheries in a sustainable way, a greater understanding of the genetic attributes of wild and cultured mud crab stocks is required. Application of these results can provide benefits for managing wild and cultured Asian mud crab populations for multiple purposes, including commercial production, recreation and conservation, and for increasing the profitability and sustainability of newly emerging crab culture industries. Phylogeographic patterns and the genetic structure of Asian mud crab populations across the IWP were assessed to determine whether they were concordant with those of other widespread taxa possessing pelagic larvae of relatively long duration. A 597 bp fragment of the mitochondrial DNA COI gene was amplified and screened for variation in a total of 297 individuals of S. paramamosain from six sampling sites across the species’ natural geographical distribution in the IWP, and 36 unique haplotypes were identified. Haplotype diversities per site ranged from 0.516 to 0.879. Nucleotide diversity estimates among haplotypes ranged from 0.11% to 0.48%. The maximum divergence observed among S. paramamosain samples was 1.533%, and the samples formed essentially a single monophyletic group, as no obvious clades were associated with the geographical location of sites. A weak positive relationship was observed, however, between genetic distance and geographical distance among sites. Microsatellite markers were then used to assess contemporary gene flow and population structure in Asian mud crab populations sampled across their natural distribution in the IWP. Eight microsatellite loci were screened in the sampled S. paramamosain populations and showed high allelic diversity in all populations. In total, 344 individuals were analysed and 304 microsatellite alleles were found across the eight loci. The mean number of alleles per locus at each site ranged from 20.75 to 28.25, and mean allelic richness per site varied from 17.2 to 18.9. All sites showed high levels of heterozygosity: average expected heterozygosities across loci ranged from 0.917 to 0.953, while mean observed heterozygosity ranged from 0.916 to 0.959. Allele diversities were similar at all sites and across all loci. The results did not show any evidence for major differences in allele frequencies among sites, and patterns of allele frequencies were very similar in all populations across all loci. Estimates of population differentiation (FST) were relatively low and most probably largely reflect intra-individual variation at very highly variable loci. Results from the nuclear DNA analysis showed evidence for only very limited population genetic structure among the sampled S. paramamosain, and a positive and significant association between genetic and geographical distance among sample sites. Microsatellite markers were then employed to determine whether adequate levels of genetic diversity have been captured in crab hatcheries for the breeding cycle.
The results showed that all microsatellite loci were polymorphic in the hatchery samples. Culture populations were, in general, highly genetically depauperate compared with comparable wild populations, with only 3 to 8 alleles recorded per population for the same set of loci. In contrast, very high numbers of alleles per locus were found in the reference wild S. paramamosain populations, ranging from 18 to 46 alleles per locus per population. In general, this translates into a 3- to 10-fold decline in mean allelic richness per locus in all culture stocks compared with their wild reference counterparts. Furthermore, most loci in all cultured S. paramamosain samples showed departures from Hardy-Weinberg equilibrium (HWE). Allele frequencies in culture samples were very different from those present in comparable wild reference samples, reflected in particular in a large decline in allele diversity per locus. The pattern observed was best explained by significant impacts of the breeding practices employed in hatcheries, rather than by natural differentiation among the wild populations used as the source of broodstock. Current problems and management strategies for the species, for both the medium- and long-term development of the new culture industry, are discussed. The priority research to be undertaken over the medium term for S. paramamosain should be to close the life cycle fully, to allow individuals to be bred on demand and their offspring numbers equalised to control broodstock reproductive contributions. Establishing a broodstock register and pedigree mating system will be required before any selection program is implemented. This will ensure that sufficient genetic variation is available to allow genetic gains to be achieved sustainably in a future stock improvement program. A fundamental starting point for improving hatchery practices will be to encourage farmers and hatchery managers to spawn more females in their hatcheries, as this will increase background genetic diversity in culture stocks. Combining crablet cohorts from multiple hatcheries into a single cohort for supply to farmers, or rotating breeding females regularly in hatcheries, will help to address immediate genetic diversity problems in culture stocks. Application of these results can provide benefits for managing wild and cultured Asian mud crab populations more efficiently. Over the long term, application of data on genetic diversity in wild and cultured stocks of Asian mud crab will contribute to the development of sustainable and productive culture industries in Vietnam and other countries in the IWP, and can contribute towards the conservation of wild genetic resources.
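As an illustration of the diversity summaries reported above, here is a minimal sketch, on made-up genotype data, of expected heterozygosity (He = 1 minus the sum of squared allele frequencies), observed heterozygosity and the allele count at one locus; rarefaction-based allelic richness and FST are not reproduced.

    from collections import Counter

    # Hypothetical diploid genotypes at one microsatellite locus:
    # each tuple holds one individual's two allele sizes (base pairs).
    genotypes = [(188, 192), (188, 196), (192, 192), (200, 188), (196, 204)]

    alleles = [a for g in genotypes for a in g]
    counts = Counter(alleles)
    n = len(alleles)

    num_alleles = len(counts)  # number of distinct alleles at this locus
    he = 1.0 - sum((c / n) ** 2 for c in counts.values())  # expected heterozygosity
    ho = sum(1 for a, b in genotypes if a != b) / len(genotypes)  # observed heterozygosity

    print(num_alleles, round(he, 3), round(ho, 3))  # -> 5 0.76 0.8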

Relevance:

60.00%

Publisher:

Abstract:

Humankind has been dealing with all kinds of disasters since the dawn of time. The risk and impact of disasters producing mass casualties worldwide are increasing, due partly to global warming as well as to population growth, increased density and the aging population. China, as a country with a large population, vast territory, and complex climatic and geographical conditions, has been plagued by all kinds of disasters. Disaster health management has traditionally been a relatively arcane discipline within public health. However, SARS, avian influenza, earthquakes and floods, along with the need to be better prepared for the Olympic Games in China, have brought disasters, their management and their potential for large-scale health consequences for populations to the attention of the public, the government and the international community alike. As a result, significant improvements were made to the disaster management policy framework, as well as changes to systems and structures to incorporate an improved disaster management focus. This involved the upgrade of the Centres for Disease Control and Prevention (CDC) throughout China to monitor and better control health consequences, particularly those of infectious disease outbreaks. However, as was seen in the southern China snowstorm and the Wenchuan earthquake in 2008, there remains a lack of integrated disaster management and efficient medical rescue, which has been costly for China in both economic and health terms. In the context of a very large and complex country, there is a need to better understand whether these changes have resulted in effective management of the health impacts of such incidents. To date, the health consequences of disasters, particularly in China, have not been a major focus of study. The main aim of this study is to analyse and evaluate disaster health management policy in China and, in particular, its ability to manage the health consequences of disasters effectively. Flooding has been selected for this study as it is a common and significant disaster type in China and throughout the world. This information will then be used to guide conceptual understanding of the health consequences of floods. A secondary aim of the study is to compare disaster health management in China and Australia, as these countries differ in their length of experience in having a formalised policy response. The final aim of the study is to determine the extent to which Walt and Gilson’s (1994) model of policy explains how disaster management policy in China was developed and implemented after SARS in 2003 to the present day. This study utilised a case study methodology. A document analysis and literature search of Chinese and English sources were undertaken to analyse and produce a chronology of disaster health management policy in China. Additionally, three detailed case studies of flood health management in China were undertaken, along with three case studies in Australia, in order to examine the policy response and any health consequences stemming from the floods. A total of 30 key international disaster health management experts were surveyed to identify fundamental elements and principles of a successful policy framework for disaster health management. Key policy ingredients were identified from the literature, the case studies and the survey of experts.
Walt and Gilson’s (1994) policy model, which focuses on the actors, content, context and process of policy, was found to be a useful model for analysing disaster health management policy development and implementation in China. This thesis is divided into four parts. Part 1 is a brief overview of the issues and context to set the scene. Part 2 examines the conceptual and operational context, including the international literature, government documents and the operational environment for disaster health management in China. Part 3 examines primary sources of information to inform the analysis. This involves two key studies:
• A comparative analysis of the management of floods in China and Australia
• A survey of international experts in the field of disaster management, so as to inform the evaluation of the policy framework in existence in China and the criteria upon which the expression of that policy could be evaluated
Part 4 describes the key outcomes of this research, which include:
• A conceptual framework for describing the health consequences of floods
• A conceptual framework for disaster health management
• An evaluation of the disaster health management policy and its implementation in China.
The research outcomes clearly identified that the most significant improvements are to be derived from improvements in the generic management of disasters, rather than the health aspects alone. Thus, the key findings and recommendations tend to focus on generic issues. The key findings of this research include the following:
• The health consequences of floods may be described in terms of time as ‘immediate’, ‘medium term’ and ‘long term’, and in relation to causation as ‘direct’ and ‘indirect’ consequences of the flood. These two aspects form a matrix which in turn guides management responses.
• Disaster health management in China requires a more comprehensive response throughout the cycle of prevention, preparedness, response and recovery, but it also requires a more concentrated effort on policy implementation to ensure the translation of the policy framework into effective incident management.
• The policy framework in China is largely of international standard, with a sound legislative base. In addition, the development of the Centres for Disease Control and Prevention has provided the basis for a systematic approach to health consequence management. However, the key weaknesses in the current system include:
o the lack of a key central structure to provide the infrastructure with vital support for policy development, implementation and evaluation; and
o the lack of well-prepared local response teams similar to the local government based volunteer groups in Australia.
• The system lacks structures to coordinate government action at the local level. The result is a poorly coordinated local response and a lack of clarity regarding the point at which escalation of the response to higher levels of government is advisable. These shortcomings result in higher levels of risk and negative health impacts.
The key recommendations arising from this study are:
1. Disaster health management policy in China should be enhanced by incorporating disaster management considerations into policy development, and by requiring a disaster management risk analysis and disaster management impact statement for development proposals.
2. China should transform existing organisations to establish a central organisation similar to the Federal Emergency Management Agency (FEMA) in the USA or Emergency Management Australia (EMA) in Australia. This organisation would be responsible for leading nationwide preparedness through planning, standards development, education and incident evaluation, and for providing operational support to national and local government bodies in the event of a major incident.
3. China should review national and local plans to reflect consistency in planning, and to emphasise the advantages of the integrated planning process.
4. China should enhance community resilience through community education and the development of a local volunteer organisation. It should develop a national strategy which sets direction and standards for education and training, and which requires system testing through exercises. Other initiatives may include the development of a local volunteer capability, with appropriate training, to assist professional response agencies such as police and fire services in a major incident. An existing organisation such as the Communist Party may be an appropriate structure to provide this response in a cost-effective manner.
5. China should continue the development of professional emergency services, particularly ambulance services, to ensure an effective infrastructure is in place to support the emergency response in disasters.
6. Funding for disaster health management should be enhanced, not only from government but also from other sources such as donations and insurance. A more transparent mechanism is needed to ensure that funding is disseminated according to the needs of the people affected.
7. Emphasis should be placed on prevention and preparedness, especially on effective disaster warnings.
8. China should develop local disaster health management infrastructure, utilising existing resources wherever possible. Strategies for enhancing local infrastructure could include the identification of local resources (including military resources) which could be made available to support disaster responses, together with operational procedures to access those resources.
Implementation of these recommendations should better position China to reduce the significant health consequences experienced each year from major incidents such as floods, and to provide an increased level of confidence to the community about the country's capacity to manage such events.

Relevance:

60.00%

Publisher:

Abstract:

Floods are the most common type of disaster globally, responsible for almost 53,000 deaths in the last decade alone, with deaths in low-income countries outnumbering those in high-income countries by 23 to 1. This review assessed recent epidemiological evidence on the impacts of floods on human health. Published articles (2004–2011) on the quantitative relationship between floods and health were systematically reviewed, and 35 relevant epidemiological studies were identified. Health outcomes were categorized as short- or long-term and were found to depend on the flood characteristics and people's vulnerability. Long-term health effects are currently not well understood. Mortality rates were found to increase by up to 50% in the first year post-flood. After floods, there is an increased risk of disease outbreaks such as hepatitis E, gastrointestinal disease and leptospirosis, particularly in areas with poor hygiene and displaced populations. Psychological distress in survivors (prevalence 8.6% to 53% two years post-flood) can also exacerbate physical illness. There is a need for effective policies to reduce and prevent flood-related morbidity and mortality, and such steps are contingent upon an improved understanding of the potential health impacts of floods. Global trends in urbanization, burden of disease, malnutrition and maternal and child health must be better reflected in flood preparedness and mitigation programs.

Relevance:

60.00%

Publisher:

Abstract:

Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now regarded as the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation brings together significant work undertaken locally and overseas, and focuses mainly on septic tanks in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. The treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, the regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantage of multi-chamber over single-chamber septic tanks is an issue that needs to be resolved in view of conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be attributed mainly to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for the treatment of wastewater, and the disinfection of effluent prior to disposal, is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of them do not perform to stipulated standards, and quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems, including intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters, and particularly peat biofilters, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies. This is an issue that needs further investigation, and as such biofilters should still be considered to be at the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common approach, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected on the basis of site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area, due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent. It has to be of exceptionally good quality to ensure that there are no resulting public health impacts due to aerosol drift. This is the main issue of concern, given the unreliability of effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. Despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern lies. Greywater requires similar consideration. Surface irrigation of greywater is currently permitted in a number of local authority jurisdictions in Queensland. Considering that greywater constitutes the largest fraction of the total wastewater generated in a household, it can be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is a concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subject to stringent guidelines. Under these circumstances, the surface application of any wastewater requires careful consideration. The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions, and as such their applicability is location-specific. The design of systems based solely on evapotranspiration is also questionable; to ensure greater reliability, such systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Once the clogging mat forms, the capacity of the soil to handle effluent is no longer governed by the soil's hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics, and the mechanisms of clogging mat formation are influenced by various physical, chemical and biological processes. Biological clogging is the most common process and occurs when bacterial growth or its by-products reduce the soil pore diameters; it is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process, which is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate; this is in fact the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated either to control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short rest periods are contradictory.
It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to decompose the clogging mat completely, and that the intermediate by-products formed by aerobic processes in fact lead to even more severe clogging. It has further been recommended that rest periods should be much longer, in the range of about six months, which entails the provision of a second, alternating seepage bed. Other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that eventuates after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial rather than parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be of only short-term benefit. Another important issue is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat, as is the nature of the suspended solids. The finer particles from extended aeration systems, compared with those from septic tanks, penetrate deeper into the soil and hence ultimately cause a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies, which have shown that effluent quality may be a factor in the case of highly permeable soils but not necessarily with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by the surface ponding of effluent or the seepage of contaminants into groundwater, can be very serious, leading to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent, and there are a number of documented instances of septic tank-related waterborne disease outbreaks affecting large numbers of people. In one recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A because no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate, since conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore, based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area.
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed, or where surface runoff of effluent occurs as a result of system failure. Soils also have a finite capacity for the removal of phosphorus; once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is controlled. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site determines the overall capability classification for that site, which in turn dictates the type of effluent disposal method to be adopted.
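As a worked illustration of the sizing logic implied above, a minimal sketch that sizes a soil absorption area from a design flow and an assumed long-term acceptance rate, i.e. the equilibrium infiltration rate through the clogged zone rather than the clean-soil percolation rate; both input values are hypothetical, and real designs must follow local codes and site evaluation.

    # Hypothetical inputs only; not design guidance.
    design_flow_l_per_day = 900.0  # e.g. an assumed 4-person household at 225 L/person/day
    ltar_l_per_m2_per_day = 15.0   # assumed long-term acceptance rate after clogging-mat formation

    # Required infiltrative area = daily flow / long-term acceptance rate.
    required_area_m2 = design_flow_l_per_day / ltar_l_per_m2_per_day
    print(f"required infiltrative area: {required_area_m2:.0f} m^2")  # -> 60 m^2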

Relevance:

60.00%

Publisher:

Abstract:

Dengue fever is one of the world’s most important vector-borne diseases. The transmission area of this disease continues to expand due to many factors, including urban sprawl, increased travel and global warming. Current preventative techniques are based primarily on controlling mosquito vectors, as other prophylactic measures, such as a tetravalent vaccine, are unlikely to be available in the foreseeable future. However, the continually increasing dengue incidence suggests that this strategy alone is not sufficient. Epidemiological models attempt to predict future outbreaks using information on the risk factors of the disease. Through a systematic literature review, this paper analyzes the different modeling methods and their outputs in terms of accurately predicting disease outbreaks. We found that many previous studies have not sufficiently accounted for the spatio-temporal features of the disease in the modeling process. Yet advances in technology now make it possible to incorporate such information, along with socio-environmental factors, into models that can serve as early warning systems, albeit ones limited geographically to a local scale.
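A minimal sketch, on synthetic data, of one common modelling device in this literature: a Poisson regression of monthly case counts on climate covariates lagged by a fixed interval, which can serve as a crude early-warning signal. The covariate names, lag length and coefficients are all invented for illustration.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    months = 120
    rainfall = rng.gamma(2.0, 60.0, months)                            # hypothetical monthly rainfall (mm)
    temperature = 26 + 3 * np.sin(2 * np.pi * np.arange(months) / 12)  # seasonal temperature (C)

    # Month t is modelled from conditions `lag` months earlier (lag choice illustrative).
    lag = 2
    X = sm.add_constant(np.column_stack([rainfall[:-lag], temperature[:-lag]]))
    mu = np.exp(0.5 + 0.004 * rainfall[:-lag] + 0.05 * temperature[:-lag])
    cases = rng.poisson(mu)  # synthetic monthly dengue case counts

    model = sm.GLM(cases, X, family=sm.families.Poisson()).fit()

    # Early warning: predicted count for the first unobserved month, computed
    # from covariates already observed `lag` months before it.
    nxt = sm.add_constant(np.array([[rainfall[-lag], temperature[-lag]]]), has_constant="add")
    print(model.predict(nxt))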